
# 128K Long Context Processing

## Ling Lite 1.5

License: MIT · Publisher: inclusionAI · Downloads: 46 · Likes: 3

Ling is a large-scale Mixture of Experts (MoE) language model open-sourced by inclusionAI. The Lite version has 16.8 billion total parameters with 2.75 billion activated parameters, and demonstrates strong performance relative to its activated parameter count.

Tags: Large Language Model, Transformers
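
Since the card lists the model under the Transformers ecosystem, loading it would typically follow the standard `AutoModelForCausalLM` pattern. The sketch below assumes the checkpoint is published on the Hugging Face Hub under the id `inclusionAI/Ling-lite-1.5` (an assumption) and that its MoE layers ship as custom modeling code:

```python
# Minimal sketch: loading a Transformers-hosted MoE checkpoint such as Ling Lite.
# The repo id below is an assumption; check the publisher's page for the exact name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "inclusionAI/Ling-lite-1.5"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # use the checkpoint's native precision (bf16/fp16)
    device_map="auto",       # shard the 16.8B-parameter MoE across available devices (needs accelerate)
    trust_remote_code=True,  # MoE models often rely on custom modeling code
)

prompt = "Explain why Mixture of Experts models activate only a subset of parameters."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
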
## Qwen3 235B A22B 128K GGUF

License: Apache-2.0 · Publisher: unsloth · Downloads: 310.66k · Likes: 26

Qwen3 is the latest generation of large language models in the Tongyi Qianwen (Qwen) series, offering a complete suite of dense and Mixture of Experts (MoE) models. Built on large-scale training, Qwen3 makes significant advances in reasoning, instruction following, agent capabilities, and multilingual support. This listing is a GGUF build of the 235B-parameter MoE variant (22B activated parameters) with a 128K-token context window.

Tags: Large Language Model, English
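
GGUF builds like this one target llama.cpp-based runtimes. Below is a minimal sketch using llama-cpp-python; the repo id and quant filename are assumptions, and a 235B MoE at the full 128K context needs far more memory than a single consumer GPU provides:

```python
# Minimal sketch: running a 128K-context GGUF quant with llama-cpp-python.
# Repo id and filename pattern are assumptions; pick a real quant file from the Hub page.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="unsloth/Qwen3-235B-A22B-128K-GGUF",  # assumed Hub id
    filename="*Q4_K_M*.gguf",                     # assumed quant, matched by glob
    n_ctx=131072,                                 # request the full 128K context window
    n_gpu_layers=-1,                              # offload every layer that fits on the GPU
)

out = llm.create_chat_completion(
    messages=[{"role": "user",
               "content": "Summarize the trade-offs of dense vs. MoE language models."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```
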
## C4ai Command R Plus Fp8

Publisher: FriendliAI · Downloads: 35 · Likes: 4

C4AI Command R+ is an open-weight research model with 104 billion parameters and advanced capabilities, including Retrieval-Augmented Generation (RAG) and tool use for automating complex tasks. This listing is an FP8-quantized build published by FriendliAI.

Tags: Large Language Model, Transformers, Supports Multiple Languages
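
FP8 checkpoints of this size are usually served through an inference engine rather than loaded directly. The sketch below uses vLLM with a simple prompt-stuffed RAG pattern; the repo id and GPU count are assumptions, FP8 execution requires recent GPUs, and this generic prompt stands in for (rather than reproduces) the model's dedicated grounded-generation template:

```python
# Minimal sketch: prompt-stuffed RAG against an FP8 Command R+ checkpoint via vLLM.
# Repo id and GPU count are assumptions; a 104B model needs several large GPUs even at FP8.
from vllm import LLM, SamplingParams

llm = LLM(
    model="FriendliAI/c4ai-command-r-plus-fp8",  # assumed Hub id
    tensor_parallel_size=4,                      # adjust to the GPUs actually available
    max_model_len=16384,                         # keep the KV cache modest for this sketch
)

documents = [
    "Doc 1: The 128K-context tier targets long reports and multi-file code review.",
    "Doc 2: FP8 weights roughly halve memory use compared with FP16.",
]
question = "What are the benefits of the FP8 build for long-context workloads?"

# Generic grounding: retrieved snippets are concatenated ahead of the question.
prompt = (
    "Answer using only the documents below.\n\n"
    + "\n".join(documents)
    + f"\n\nQuestion: {question}\nAnswer:"
)

outputs = llm.generate([prompt], SamplingParams(temperature=0.2, max_tokens=200))
print(outputs[0].outputs[0].text)
```
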